Fourier Series



Neural Information Processing Systems

Quantum computing has been applied in areas such as breaking cryptographic systems [1], searching databases [2], and quantum simulation [3, 4], where it offers a quantum speedup over the best known classical algorithms. With the rapid development of quantum hardware, recent results [5-7] have demonstrated quantum advantages on specific tasks.





Appendix A: Additional Experiments

Neural Information Processing Systems

For more complicated periodic functions, see Figures 11 and 12. For larger values of a, the extrapolation improves. We use Adam as the optimizer. The region within the dashed vertical lines is the range of the training set; the learning range is indicated by the blue vertical bars.
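
As a rough reconstruction of this setup, the sketch below fits a periodic target on a bounded training range with Adam and then measures error beyond that range; the target function, network, and value of a are placeholders, not the paper's exact choices.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of the extrapolation setup described above: fit a
# periodic target on the training range [-a, a] with Adam, then
# evaluate beyond it. The target, network, and value of a are
# placeholders, not the paper's exact choices.
a = 4.0
x_train = torch.linspace(-a, a, 512).unsqueeze(-1)
y_train = torch.sin(2 * torch.pi * x_train)

model = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    F.mse_loss(model(x_train), y_train).backward()
    opt.step()

# Extrapolation error outside the training range (the dashed lines).
x_test = torch.linspace(-3 * a, 3 * a, 2048).unsqueeze(-1)
with torch.no_grad():
    extrap_err = F.mse_loss(model(x_test), torch.sin(2 * torch.pi * x_test))
```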



Sinusoidal Approximation Theorem for Kolmogorov-Arnold Networks

Gleyzer, Sergei, Nguyen, Hanh, Ramakrishnan, Dinesh P., Reinhardt, Eric A. F.

arXiv.org Machine Learning

The Kolmogorov-Arnold representation theorem states that any continuous multivariable function can be exactly represented as a finite superposition of continuous single-variable functions. Subsequent simplifications of this representation involve expressing these functions as parameterized sums of a smaller number of unique monotonic functions. These developments led to the proof of the universal approximation capabilities of multilayer perceptron networks with sigmoidal activations, the alternative theoretical direction taken by most modern neural networks. Kolmogorov-Arnold Networks (KANs) have recently been proposed as an alternative to multilayer perceptrons. KANs feature learnable nonlinear activations applied directly to input values, modeled as weighted sums of basis spline functions; this approach replaces the linear transformations and sigmoidal post-activations used in traditional perceptrons. Subsequent works have explored alternatives to spline-based activations. In this work, we propose a novel KAN variant that replaces both the inner and outer functions in the Kolmogorov-Arnold representation with weighted sinusoidal functions of learnable frequencies. Inspired by simplifications introduced by Lorentz and Sprecher, we fix the phases of the sinusoidal activations to linearly spaced constant values and prove the theoretical validity of this construction. We also conduct numerical experiments to evaluate its performance on a range of multivariable functions, comparing it with fixed-frequency Fourier transform methods and multilayer perceptrons (MLPs). We show that it outperforms the fixed-frequency Fourier transform and achieves performance comparable to MLPs.
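
As a rough illustration of this construction, the sketch below implements one layer whose per-edge activations are weighted sinusoids with learnable frequencies and amplitudes and fixed, linearly spaced phases; the class name, shapes, initialization, and phase range are our assumptions, not the paper's exact definition.

```python
import torch
import torch.nn as nn

class SinusoidalKANLayer(nn.Module):
    """Sketch of one KAN layer with sinusoidal activations: each
    input-output edge carries a sum of sines with learnable frequencies
    and amplitudes, while the phases are fixed to linearly spaced
    constants (assumed here to span [0, pi])."""

    def __init__(self, in_dim, out_dim, num_harmonics=8):
        super().__init__()
        self.freq = nn.Parameter(torch.ones(in_dim, out_dim, num_harmonics))
        self.amp = nn.Parameter(0.1 * torch.randn(in_dim, out_dim, num_harmonics))
        # Phases fixed to linearly spaced constants, per the abstract.
        self.register_buffer("phase", torch.linspace(0.0, torch.pi, num_harmonics))

    def forward(self, x):                        # x: (batch, in_dim)
        z = x[:, :, None, None] * self.freq + self.phase
        # Sum the weighted sinusoids over inputs and harmonics.
        return (self.amp * torch.sin(z)).sum(dim=(1, 3))  # (batch, out_dim)
```

Stacking two such layers mirrors the inner/outer composition of the Kolmogorov-Arnold representation.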


QFGN: A Quantum Approach to High-Fidelity Implicit Neural Representations

Jin, Hongni, Singh, Gurinder, Merz, Kenneth M. Jr

arXiv.org Artificial Intelligence

Implicit neural representations have shown potential in various applications. However, accurately reconstructing images or recovering fine detail via image super-resolution remains challenging. This paper introduces the Quantum Fourier Gaussian Network (QFGN), a quantum-based machine learning model for better signal representations. The frequency spectrum is balanced by penalizing the low-frequency components, leading to improved expressivity of the quantum circuits. The results demonstrate that with minimal parameters, QFGN outperforms the current state-of-the-art (SOTA) models. Despite hardware noise, the model achieves accuracy comparable to that of SIREN, highlighting the potential applications of quantum machine learning in this field.
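
The quantum circuit itself is beyond a short sketch, but the frequency-balancing idea has a simple classical analogue: measure how much of a predicted signal's spectral energy sits in the lowest-frequency bins and penalize it. Everything below (the function name, cutoff, and functional form) is our illustrative assumption, not QFGN's actual penalty.

```python
import torch

def low_frequency_penalty(pred, cutoff_ratio=0.1):
    """Classical illustration of the frequency-balancing idea only: the
    actual QFGN penalty acts within a quantum-circuit model, and the
    cutoff and functional form here are our assumptions."""
    spectrum = torch.fft.rfft(pred, dim=-1).abs()
    cutoff = max(1, int(spectrum.shape[-1] * cutoff_ratio))
    low = spectrum[..., :cutoff].sum(dim=-1)
    total = spectrum.sum(dim=-1) + 1e-8
    return (low / total).mean()  # fraction of spectral energy below cutoff

# Hypothetical usage: discourage a low-frequency-dominated spectrum.
# loss = reconstruction_loss + 0.1 * low_frequency_penalty(pred)
```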


KA-GNN: Kolmogorov-Arnold Graph Neural Networks for Molecular Property Prediction

Li, Longlong, Zhang, Yipeng, Wang, Guanghui, Xia, Kelin

arXiv.org Artificial Intelligence

As key models in geometric deep learning, graph neural networks have demonstrated enormous power in molecular data analysis. Recently, a specially designed learning scheme, known as the Kolmogorov-Arnold Network (KAN), has shown unique potential for improving model accuracy, efficiency, and explainability. Here we propose the first non-trivial Kolmogorov-Arnold Network-based Graph Neural Networks (KA-GNNs), including a KAN-based graph convolutional network (KA-GCN) and a KAN-based graph attention network (KA-GAT). The essential idea is to utilize KAN's unique power to optimize GNN architectures at three major levels: node embedding, message passing, and readout. Further, leveraging the strong approximation capability of Fourier series, we develop a Fourier series-based KAN model and provide a rigorous mathematical proof of the robust approximation capability of this Fourier KAN architecture. To validate our KA-GNNs, we consider seven of the most widely used benchmark datasets for molecular property prediction and compare extensively with existing state-of-the-art models. We find that our KA-GNNs outperform traditional GNN models. More importantly, our Fourier KAN module not only increases model accuracy but also reduces computational time. This work not only highlights the great power of KA-GNNs in molecular property prediction but also provides a novel geometric deep learning framework for general non-Euclidean data analysis.
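
As a hedged sketch of what a Fourier-series KAN block might look like when dropped into a GNN layer, the snippet below expresses each output as a learnable truncated Fourier series of each input; the grid size, initialization, and class name are our assumptions rather than the paper's parameterization.

```python
import torch
import torch.nn as nn

class FourierKAN(nn.Module):
    """Sketch of a Fourier-series KAN block of the kind KA-GNN plugs
    into node embedding, message passing, and readout: each output is
    a learnable truncated Fourier series of each input."""

    def __init__(self, in_dim, out_dim, grid=4):
        super().__init__()
        # Integer frequencies 1..grid for the truncated series.
        self.register_buffer("freqs", torch.arange(1, grid + 1, dtype=torch.float32))
        scale = (in_dim * grid) ** -0.5
        # Cosine and sine coefficients per (output, input, frequency).
        self.coeff = nn.Parameter(scale * torch.randn(2, out_dim, in_dim, grid))

    def forward(self, x):                        # x: (batch, in_dim)
        z = x[..., None] * self.freqs            # (batch, in_dim, grid)
        return (torch.einsum("big,oig->bo", torch.cos(z), self.coeff[0])
                + torch.einsum("big,oig->bo", torch.sin(z), self.coeff[1]))
```

In a KA-GCN-style update, such a block would stand in for the usual linear-plus-activation step applied to aggregated neighbor features.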


FAN: Fourier Analysis Networks

Dong, Yihong, Li, Ge, Tao, Yongding, Jiang, Xue, Zhang, Kechi, Li, Jia, Su, Jing, Zhang, Jun, Xu, Jingjing

arXiv.org Artificial Intelligence

Despite the remarkable success achieved by neural networks, particularly those represented by the MLP and the Transformer, we reveal that they exhibit potential flaws in modeling and reasoning about periodicity: they tend to memorize periodic data rather than genuinely understand the underlying principles of periodicity. Yet periodicity is a crucial trait in various forms of reasoning and generalization, underpinning predictability across natural and engineered systems through recurring patterns in observations. In this paper, we propose FAN, a novel network architecture based on Fourier analysis, which empowers the ability to efficiently model and reason about periodic phenomena. By introducing the Fourier series, periodicity is naturally integrated into the structure and computational processes of the neural network, achieving more accurate expression and prediction of periodic patterns. As a promising substitute for the multi-layer perceptron (MLP), FAN can seamlessly replace MLP in various models with fewer parameters and FLOPs. Through extensive experiments, we demonstrate the effectiveness of FAN in modeling and reasoning about periodic functions, and its superiority and generalizability across a range of real-world tasks, including symbolic formula representation, time series forecasting, and language modeling.
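
As we read the abstract, a FAN layer devotes part of its output to an explicit Fourier component, the cosine and sine of one learned projection of the input, and the rest to a standard nonlinear projection. The sketch below is a minimal rendering of that split; the periodic ratio, shared projection, and GELU activation are our assumptions, not the paper's exact hyperparameters.

```python
import torch
import torch.nn as nn

class FANLayer(nn.Module):
    """Sketch of a FAN layer: a slice of the output is an explicit
    Fourier component (cos/sin of a learned projection), the remainder
    an ordinary nonlinear projection. Split ratio and activation are
    assumptions."""

    def __init__(self, in_dim, out_dim, periodic_ratio=0.25):
        super().__init__()
        d_p = max(1, int(out_dim * periodic_ratio) // 2)  # dims each for cos and sin
        self.p = nn.Linear(in_dim, d_p, bias=False)       # Fourier branch
        self.g = nn.Linear(in_dim, out_dim - 2 * d_p)     # standard branch
        self.act = nn.GELU()

    def forward(self, x):
        px = self.p(x)
        # Concatenate the periodic and non-periodic parts of the output.
        return torch.cat([torch.cos(px), torch.sin(px), self.act(self.g(x))], dim=-1)
```

Because the cosine and sine branches share one projection, stacking such layers in place of MLP blocks is the drop-in replacement with fewer parameters that the abstract describes.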